Double-bagging: combining classifiers by bootstrap aggregation
Authors
Abstract
The combination of classifiers leads to a substantial reduction of misclassification error in a wide range of applications and benchmark problems. We suggest using an out-of-bag sample for combining different classifiers. In our setup, a linear discriminant analysis is performed using the observations in the out-of-bag sample, and the corresponding discriminant variables computed for the observations in the bootstrap sample are used as additional predictors for a classification tree. Because only two classifiers are combined, method and variable selection bias is not a problem for the corresponding estimate of misclassification error, and the need for an additional test sample disappears. Moreover, the procedure performs comparably to the best classifiers in a number of artificial examples and applications. © 2002 Pattern Recognition Society. Published by Elsevier Science Ltd. All rights reserved.
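The abstract describes the double-bagging procedure only in words. Below is a minimal sketch of that idea in Python, assuming scikit-learn's LinearDiscriminantAnalysis and DecisionTreeClassifier, NumPy arrays X and y with integer class labels, and illustrative function names (double_bagging_fit, double_bagging_predict) that are not the authors' original implementation; the number of bootstrap replications is arbitrary here and would be tuned in practice.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier

def double_bagging_fit(X, y, n_estimators=25, random_state=0):
    """Per bootstrap replication, estimate an LDA on the out-of-bag
    observations and use its discriminant variables as additional
    predictors for a classification tree (sketch, not the original code)."""
    rng = np.random.default_rng(random_state)
    n = len(y)
    ensemble = []
    for _ in range(n_estimators):
        boot = rng.integers(0, n, size=n)            # bootstrap indices
        oob = np.setdiff1d(np.arange(n), boot)       # out-of-bag indices
        if len(np.unique(y[oob])) < 2:
            continue                                 # LDA needs >= 2 classes
        lda = LinearDiscriminantAnalysis().fit(X[oob], y[oob])
        # discriminant variables of the bootstrap observations become
        # additional predictors for the classification tree
        Z = np.hstack([X[boot], lda.transform(X[boot])])
        tree = DecisionTreeClassifier().fit(Z, y[boot])
        ensemble.append((lda, tree))
    return ensemble

def double_bagging_predict(ensemble, X):
    """Aggregate the trees by majority vote; class labels are assumed to
    be coded as integers 0..k-1 for simplicity."""
    votes = np.array([tree.predict(np.hstack([X, lda.transform(X)]))
                      for lda, tree in ensemble])
    return np.apply_along_axis(
        lambda col: np.bincount(col.astype(int)).argmax(), 0, votes)
```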
Similar resources
On the stability of support vector machines for face detection
In this paper we study the stability of support vector machines in face detection by decomposing their average prediction error into the bias, variance, and aggregation effect terms. Such an analysis indicates whether bagging, a method for generating multiple versions of a classifier from bootstrap samples of a training set, and combining their outcomes by majority voting, is expected to improv...
Combining Classifiers based on Gaussian Mixtures
A combination of classification rules (classifiers) is known as an Ensemble, and in general it is more accurate than the individual classifiers used to build it. Two popular methods to construct an Ensemble are Bagging (Bootstrap aggregating) introduced by Breiman, [4] and Boosting (Freund and Schapire, [11]). Both methods rely on resampling techniques to obtain different training sets for each...
Distributed learning with bagging-like performance
Bagging forms a committee of classifiers by bootstrap aggregation of training sets from a pool of training data. A simple alternative to bagging is to partition the data into disjoint subsets. Experiments with decision tree and neural network classifiers on various datasets show that, given the same size partitions and bags, disjoint partitions result in performance equivalent to, or better tha...
Combining Bagging and Boosting
Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of classifiers using the same learning algorithm for the base-classifiers. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, i...
Journal: Pattern Recognition
Volume: 36, Issue: -
Pages: -
Publication year: 2003